Marius Pachitariu: Learning visual motion in recurrent neural networks
Lecture by Marius Pachitariu, Gatsby Institute, University College London. Given to the Redwood Center for Theoretical Neuroscience at UC Berkeley.
Audio/Visual: sound, color
We present a dynamic nonlinear generative model for visual motion based on a latent representation of binary-gated Gaussian variables connected in a network. Trained on sequences of images with an STDP-like rule, the model learns to represent different movement directions in different variables. We use an online approximate inference scheme that can be mapped onto the dynamics of networks of neurons. Probed with drifting grating stimuli and moving bars of light, neurons in the model show patterns of responses analogous to those of direction-selective simple cells in primary visual cortex. We show how the computations of the model are enabled by a specific pattern of learnt asymmetric recurrent connections. I will also briefly discuss our application of recurrent neural networks as statistical models of simultaneously recorded spiking neurons.
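As a rough illustration of the ingredients named in the abstract (binary-gated latent variables, recurrent connections, and a timing-asymmetric Hebbian update), the following is a minimal toy sketch. It is not the lecture's actual model: the sizes, learning rate, stimulus, and the specific form of the STDP-like rule are all illustrative assumptions.

```python
import numpy as np

# Toy sketch: binary-gated latents driven by a drifting stimulus, with
# recurrent weights R learnt by an STDP-like (timing-asymmetric) rule.
# All names and parameters here are assumptions for illustration only.
rng = np.random.default_rng(0)

n_pixels, n_latent, n_steps = 16, 8, 50
W = rng.normal(0.0, 0.1, (n_pixels, n_latent))  # feedforward (generative) weights
R = np.zeros((n_latent, n_latent))              # recurrent weights, learnt online
eta = 0.05                                      # learning rate

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

bar = np.zeros(n_pixels)
bar[:3] = 1.0                                   # a small bar of light
g_prev = np.zeros(n_latent)
for t in range(n_steps):
    x = np.roll(bar, t)                         # bar drifting one pixel per frame
    drive = W.T @ x + R @ g_prev                # feedforward + recurrent input
    g = (sigmoid(drive) > rng.random(n_latent)).astype(float)  # binary gates
    s = g * (W.T @ x)                           # gated Gaussian-style activation
    # STDP-like update: potentiate pre(t-1) -> post(t) pairs and depress the
    # reverse ordering, pushing R toward an asymmetric, motion-sensitive form.
    R += eta * (np.outer(g, g_prev) - np.outer(g_prev, g))
    g_prev = g

print(np.allclose(R, -R.T))                     # prints True: R stays antisymmetric
```

Because every update is antisymmetric, the learnt `R` is exactly antisymmetric here; in the lecture's model the learnt asymmetry of the recurrent connections is what supports direction selectivity.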